Input-output stability of recurrent neural networks

Author

  • Jochen J. Steil
Abstract

Foreword

Recurrent neural networks are an attractive tool both for practical applications and for the modeling of biological nerve nets, but their successful application requires an understanding of their dynamical properties, in particular their stability. The present work provides an in-depth study of this challenging issue and contributes a number of new results that are also important for a broader class of recurrent systems containing nonlinear and even time-delayed feedback. The approach is based on modern concepts of control theory, with an emphasis on techniques developed in recent years for the analysis of feedback systems with parameter uncertainties. In addition to the analytic derivations, the author demonstrates how the derived criteria can be numerically evaluated with modern techniques for quadratic optimization. Some of the techniques are then illustrated with the example of a fully recurrent network learning the dynamics of a chaotic system. The present monograph offers the mathematically inclined reader an unusual but powerful approach to the stability analysis of recurrent systems and acquaints the reader with many advanced concepts that may prove useful for further research.

Acknowledgement

The presented work was mainly carried out in the neural network group of the Depart… "Dynamik rekurrenter neuronaler Netze" ("Dynamics of recurrent neural networks"). First of all I would like to thank Helge, whose scientific enthusiasm attracted me to the group a number of years ago and whose optimism remained a powerful inspiration throughout my work. Without his personal confidence and scientific support this work would not have been completed. A second root of the presented work lies in Russia. This is true with regard to content, because many of the techniques I use in this thesis were first considered there, but it is even more the case for me personally. Supported by a DAAD grant, I had the opportunity to spend the year 1995/96 at the former Electrotechnical Institute in St. Petersburg, situated at No. 5, Professor Popov Street! There I met Prof. Dr. I. B. Junger, who first acquainted me with frequency and input-output methods. Without his help and the support of Dr. Oleg Gerasimov I could not have reached the level of understanding of "frequency theory", as they call it, that was necessary for the current work. Since 1995 this connection has never been cut, and it recently even reached a new level, because Prof. …
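The foreword above refers to stability criteria that can be evaluated numerically. As a minimal, hedged sketch of the setting, the code below assumes the standard additive recurrent model dx/dt = -x + W·sigma(x) + u, which is commonly used in this literature, and checks one of the simplest sufficient input-output stability conditions, a small-gain (contraction) test ||W||_2 < 1 for a 1-Lipschitz activation such as tanh. This is not the quadratic-optimization machinery of the monograph itself; all matrices and names are illustrative assumptions.

```python
# Minimal sketch: small-gain stability check for an additive recurrent network
#   dx/dt = -x + W * sigma(x) + u(t),   sigma = tanh (1-Lipschitz).
# If ||W||_2 < 1, the feedback loop is a contraction, which is a simple
# sufficient condition for bounded inputs to produce bounded outputs.
import numpy as np

rng = np.random.default_rng(0)
n = 5
W = rng.standard_normal((n, n))
W = 0.8 * W / np.linalg.norm(W, 2)          # rescale so the example passes the test

gain = np.linalg.norm(W, 2)                  # spectral norm = worst-case loop gain
print(f"||W||_2 = {gain:.3f} ->",
      "small-gain test passed" if gain < 1.0 else "inconclusive")

# Crude forward-Euler simulation illustrating the bounded response to a bounded input.
def simulate(W, u, x0, dt=0.01, steps=5000):
    x = x0.copy()
    for k in range(steps):
        x = x + dt * (-x + W @ np.tanh(x) + u(k * dt))
    return x

u = lambda t: 0.5 * np.sin(t) * np.ones(n)   # bounded test input
x_final = simulate(W, u, x0=rng.standard_normal(n))
print("state norm after simulation:", np.linalg.norm(x_final))
```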


Similar articles

Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays

In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays which are represented by the Takagi-Sugeno (T-S) fuzzy models is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...
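The abstract above describes an LMI-based criterion derived via Lyapunov theory. As an illustrative sketch only, the snippet below checks feasibility of the most basic Lyapunov LMI, P ≻ 0 with AᵀP + PA ≺ 0, for an assumed system matrix A using CVXPY; the criterion in the cited paper additionally accounts for delays, impulses, stochastic terms, and the T-S fuzzy structure, none of which this toy example covers.

```python
# Sketch: feasibility of the basic Lyapunov LMI  A^T P + P A < 0,  P > 0,
# posed as a semidefinite program with CVXPY. The LMI criteria in the cited
# paper are considerably richer (delays, impulses, stochastic and fuzzy terms).
import numpy as np
import cvxpy as cp

A = np.array([[-2.0, 1.0],
              [0.5, -1.5]])                # example (assumed) system matrix
n = A.shape[0]
eps = 1e-6

P = cp.Variable((n, n), symmetric=True)
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()

if problem.status in ("optimal", "optimal_inaccurate"):
    print("LMI feasible: a quadratic Lyapunov function x^T P x exists")
    print(np.round(P.value, 3))
else:
    print("LMI infeasible or solver failed:", problem.status)
```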


Input-Output Stability of Recurrent Neural Networks with Time-Varying Parameters

We provide input-output stability conditions for additive recurrent neural networks regarding them as dynamical operators between their input and output function spaces. The stability analysis is based on methods from non-linear feedback system theory and includes the case of time-varying weights, for instance introduced by on-line adaptation. The results assure that there are regions in weight...
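One rough way to picture the "regions in weight space" mentioned above, under a contraction view of the additive model rather than the paper's exact operator-theoretic conditions: a time-varying weight matrix W(t) keeps the input-output map stable as long as sup_t ||W(t)||_2 stays below 1/L, with L the Lipschitz constant of the activation. The sketch below computes the resulting allowable perturbation radius around an assumed nominal W; all quantities are illustrative.

```python
# Sketch: allowable time-varying weight perturbation under a contraction view.
# If sigma is L-Lipschitz and sup_t ||W(t)||_2 * L < 1, the I/O map stays stable;
# hence perturbations Delta(t) with ||Delta(t)||_2 < 1/L - ||W_nominal||_2 are tolerated.
import numpy as np

rng = np.random.default_rng(1)
n = 4
W_nominal = rng.standard_normal((n, n))
W_nominal = 0.7 * W_nominal / np.linalg.norm(W_nominal, 2)   # assumed nominal weights
L = 1.0                                                      # Lipschitz constant of tanh

nominal_gain = np.linalg.norm(W_nominal, 2)
margin = 1.0 / L - nominal_gain
if margin > 0:
    print(f"nominal gain {nominal_gain:.3f}; time-varying perturbations with "
          f"||Delta(t)||_2 < {margin:.3f} are tolerated")
else:
    print("contraction test inconclusive for the nominal weights")
```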


Input-Output Stability of Recurrent Neural Networks

We present a frequency domain analysis of additive recurrent neural networks based on the passivity approach to input-output stability. We apply graphical circle criteria for the case of normal weight matrices, which result in effectively computable stability bounds, including systems with delay. Approximation techniques yield further generalisation to arbitrary matrices.
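For a normal weight matrix, a circle-criterion analysis of this kind reduces to an eigenvalue test. The sketch below assumes the additive model dx/dt = -x + W·sigma(x) + u with a sector-[0,1] activation such as tanh; under those assumptions, requiring positive realness of I - W/(1+jω) leads to a sufficient condition of the form Re λ < 1 - (Im λ)²/4 for every eigenvalue λ of W. The exact bounds and delay extensions in the paper may differ; this is only an illustrative instance.

```python
# Sketch: circle-criterion-style eigenvalue test for a *normal* weight matrix W
# in the additive model dx/dt = -x + W*sigma(x) + u with sigma in sector [0, 1].
# Requiring Re[I - W/(1+jw)] > 0 for all w leads (for normal W) to the parabola
#   Re(lam) < 1 - Im(lam)^2 / 4   for every eigenvalue lam of W.
import numpy as np

def is_normal(W, tol=1e-10):
    return np.allclose(W @ W.conj().T, W.conj().T @ W, atol=tol)

def circle_type_test(W):
    assert is_normal(W), "test only applies to normal matrices"
    lams = np.linalg.eigvals(W)
    return all(lam.real < 1.0 - lam.imag**2 / 4.0 for lam in lams)

# Example: a scaled rotation matrix is normal.
theta, r = 0.8, 0.9
W = r * np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
print("normal:", is_normal(W), "| stability bound satisfied:", circle_type_test(W))
```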


Local input-output stability of recurrent networks with time-varying weights

We present local conditions for input-output stability of recurrent neural networks with time-varying parameters introduced for instance by noise or on-line adaptation. The conditions guarantee that a network implements a proper mapping from time-varying input to time-varying output functions using a local equilibrium as the point of operation. We show how to calculate necessary bounds on the allow...
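A hedged way to make the "local point of operation" idea concrete: find an equilibrium of the additive model for a constant input, linearize there, and measure how far the linearization is from instability; that margin then limits the tolerable time-varying change. The sketch below uses a generic unstructured margin, the minimum over frequency of σ_min(jωI - J), which is not the specific bound derived in the paper; the model and all constants are assumptions.

```python
# Sketch: local stability margin of an additive network around an equilibrium.
#   x* solves x = W tanh(x) + u0;  Jacobian J = -I + W diag(tanh'(x*)).
# The unstructured margin min_w sigma_min(jwI - J) gives a crude bound on how
# large a perturbation of the linearization can be tolerated.
import numpy as np

rng = np.random.default_rng(2)
n = 4
W = rng.standard_normal((n, n))
W = 0.6 * W / np.linalg.norm(W, 2)           # assumed weights (contractive for clarity)
u0 = 0.3 * np.ones(n)                        # constant operating-point input

# Fixed-point iteration for the equilibrium (converges because the map contracts here).
x = np.zeros(n)
for _ in range(2000):
    x = W @ np.tanh(x) + u0

J = -np.eye(n) + W @ np.diag(1.0 - np.tanh(x)**2)   # local Jacobian
print("equilibrium locally stable:", np.all(np.linalg.eigvals(J).real < 0))

freqs = np.linspace(0.0, 50.0, 2001)
margin = min(np.linalg.svd(1j * w * np.eye(n) - J, compute_uv=False)[-1] for w in freqs)
print(f"unstructured stability margin over the frequency grid: {margin:.3f}")
```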


NLq Theory: Checking and Imposing Stability of Recurrent Neural Networks for Nonlinear Modelling

It is known that many discrete-time recurrent neural networks, such as neural state space models, multilayer Hopfield networks and locally recurrent globally feedforward neural networks, can be represented as NLq systems. Sufficient conditions for global asymptotic stability and input/output stability of NLq systems are available, including three types of criteria: diagonal scaling and crit...
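As a toy illustration of the diagonal-scaling idea only (not the full NLq machinery, which covers richer layered structures): for a discrete-time recurrence x_{k+1} = sigma(V x_k) with a componentwise 1-Lipschitz activation fixing the origin, the existence of a diagonal D ≻ 0 with ||D V D⁻¹||_2 < 1 is sufficient for global asymptotic stability of the origin, because the weighted norm ||D x_k|| then contracts geometrically. The matrix V and the candidate scaling below are assumptions for the example.

```python
# Sketch: diagonal-scaling sufficient condition for the discrete-time recurrence
#   x_{k+1} = sigma(V x_k),   sigma 1-Lipschitz componentwise with sigma(0) = 0.
# If some diagonal D > 0 gives ||D V D^{-1}||_2 < 1, the weighted norm ||D x_k||
# contracts geometrically, so the origin is globally asymptotically stable.
import numpy as np

def scaled_gain(V, d):
    D = np.diag(d)
    return np.linalg.norm(D @ V @ np.linalg.inv(D), 2)

V = np.array([[0.2, 1.5],
              [0.0, 0.3]])                  # example matrix: ||V||_2 > 1 unscaled
print("unscaled gain:", round(np.linalg.norm(V, 2), 3))

# A candidate scaling (assumed, found by hand) that shrinks the off-diagonal coupling.
d = np.array([1.0, 4.0])
print("scaled gain with D = diag(1, 4):", round(scaled_gain(V, d), 3))
```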


Monotonic Recurrent Bounded Derivative Neural Network

Neural networks applied in control loops and safety-critical domains have to meet hard requirements. First, a small approximation error is required; then the smoothness and the monotonicity of selected input-output relations have to be taken into account; and finally, for some processes, time dependencies in time series should be induced into the model. If not, then the stability of the c...
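One common, simple way to enforce monotonic input-output relations (an illustrative choice, not necessarily the bounded-derivative architecture discussed in the paper) is to constrain weights to be nonnegative: with monotonically increasing activations, nonnegative weights along every path from an input to the output make the output non-decreasing in that input. The sketch below builds such a one-hidden-layer network with a nonnegative reparameterization; the sizes and names are assumptions.

```python
# Sketch: a one-hidden-layer network that is monotonically non-decreasing in
# every input, obtained by mapping unconstrained parameters to nonnegative
# weights (softplus) and using increasing activations (tanh).
import numpy as np

rng = np.random.default_rng(3)

def softplus(z):
    return np.log1p(np.exp(-np.abs(z))) + np.maximum(z, 0.0)   # numerically stable

class MonotoneNet:
    def __init__(self, n_in, n_hidden):
        # Unconstrained parameters; softplus maps them to nonnegative weights.
        self.A = rng.standard_normal((n_hidden, n_in))
        self.b = rng.standard_normal(n_hidden)
        self.c = rng.standard_normal(n_hidden)

    def __call__(self, x):
        W1 = softplus(self.A)          # nonnegative input-to-hidden weights
        w2 = softplus(self.c)          # nonnegative hidden-to-output weights
        return w2 @ np.tanh(W1 @ x + self.b)

net = MonotoneNet(n_in=2, n_hidden=8)
x = np.array([0.3, -0.5])
# Increasing any input coordinate never decreases the output.
print(net(x), "<=", net(x + np.array([0.2, 0.0])))
```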



Publication year: 1999